Two-versions of descent conjugate gradient methods for large-scale unconstrained optimization

Authors

Abstract

Conjugate gradient methods are widely valued for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Most research in this area focuses on the choice of the conjugate gradient parameter. The current paper proposes a new type of conjugate gradient method for solving such problems. A Hessian approximation in diagonal matrix form, built from second- and third-order Taylor series expansions, is employed in this study. The sufficient descent property of the proposed algorithm is proved, and the method is shown to converge globally. In numerical experiments, the proposed method is found to be competitive with the Fletcher-Reeves (FR) method.
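
Since the abstract names the Fletcher-Reeves (FR) method as the benchmark, a minimal self-contained sketch of that baseline is given below for orientation. The paper's own ingredient, a diagonal Hessian approximation obtained from second- and third-order Taylor expansions, is not specified in the abstract and therefore is not implemented here; the function name `fletcher_reeves_cg` and the Armijo backtracking rule are illustrative choices, not taken from the paper.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Fletcher-Reeves nonlinear CG with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                      # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search along d.
        alpha, c1, fx, slope = 1.0, 1e-4, f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # FR parameter: ||g_{k+1}||^2 / ||g_k||^2
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage on the Rosenbrock test function.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(fletcher_reeves_cg(rosen, rosen_der, np.zeros(10)))
```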

Similar articles

Nonlinear Conjugate Gradient Methods with Sufficient Descent Condition for Large-Scale Unconstrained Optimization

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that, without any line search, the generated directions are always descent directions. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that these proposed methods are promising, and comp...
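
For context, the "descent" property these abstracts refer to is usually formalized as the sufficient descent condition; the statement below is the standard textbook form, with a generic constant $c$, not a condition quoted from any particular paper listed here.

```latex
% Sufficient descent condition (standard form): there exists c > 0 such that,
% for all k, the direction d_k and gradient g_k = \nabla f(x_k) satisfy
g_k^{\top} d_k \le -c \, \lVert g_k \rVert^{2}.
```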

Another Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions for Large-scale Unconstrained Optimization

In this paper we suggest another accelerated conjugate gradient algorithm for which, for all $k \ge 0$, both the descent and the conjugacy conditions are guaranteed. The search direction is selected as $d_{k+1} = -\theta_k g_{k+1} + \left( \frac{g_{k+1}^{\top} y_k}{y_k^{\top} s_k} - t_k \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k} \right) s_k$, where $g_{k+1} = \nabla f(x_{k+1})$, $s_k = x_{k+1} - x_k$, and $y_k = g_{k+1} - g_k$. The coefficients $\theta_k$ and $t_k$ in this linear combinat...
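
As a rough illustration of the direction reconstructed above, the sketch below assembles $d_{k+1}$ from the quoted formula; the rules the cited algorithm uses to choose $\theta_k$ and $t_k$ are not given in this snippet, so they appear only as plain scalar arguments.

```python
import numpy as np

def accelerated_cg_direction(g_next, s, y, theta, t):
    """d_{k+1} = -theta*g_{k+1} + ((g'y)/(y's) - t*(g's)/(y's)) * s.

    g_next   : gradient at x_{k+1}
    s, y     : s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k
    theta, t : scalar coefficients chosen by the algorithm (placeholders here)
    """
    ys = y @ s  # curvature term y_k^T s_k
    beta = (g_next @ y) / ys - t * (g_next @ s) / ys
    return -theta * g_next + beta * s
```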

Another accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization

In this paper we suggest another accelerated conjugate gradient algorithm for which, for all $k \ge 0$, both the descent and the conjugacy conditions are guaranteed. The search direction is selected as $d_{k+1} = -\theta_k g_{k+1} + \left( \frac{g_{k+1}^{\top} y_k}{y_k^{\top} s_k} - t_k \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k} \right) s_k$, where $g_{k+1} = \nabla f(x_{k+1})$, $s_k = x_{k+1} - x_k$, and $y_k = g_{k+1} - g_k$. The coefficients $\theta_k$ and $t_k$ in this linear combinat...

A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and carry out further analysis.
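
The Zoutendijk condition mentioned here is the standard summability result used in global convergence proofs for line search methods; the form below is the textbook statement under Wolfe-type line searches and Lipschitz-continuous gradients, not a result specific to this paper.

```latex
% Zoutendijk condition (standard form): under a Wolfe line search and a
% Lipschitz-continuous gradient, the iterates satisfy
\sum_{k \ge 0} \frac{\left( g_k^{\top} d_k \right)^{2}}{\lVert d_k \rVert^{2}} < \infty .
```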

A New Sufficient Descent Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new conjugate gradient method with the sufficient descent property is proposed for the unconstrained optimization problem. An attractive property of the new method is that the descent direction generated by the method always possesses the sufficient descent property, and this property is independent of the line search used and of the choice of the parameter. Under mild conditions, the global c...

Journal

Journal title: Indonesian Journal of Electrical Engineering and Computer Science

Year: 2021

ISSN: 2502-4752, 2502-4760

DOI: https://doi.org/10.11591/ijeecs.v22.i3.pp1643-1649